Add Falcon3 model support #10864
Conversation
@mokeddembillel heads up, this has been reverted because of the change to …
@slaren @ggerganov Thanks for flagging this. Working on fixing it right now.
@slaren @ggerganov Thanks again for flagging this issue. The issue is that when using meta-llama/Llama-3.1-8B-Instruct, the <|begin_of_text|> token is added to every special token when doing … [code snippet not recovered]. The screenshots show before and after. I'm fixing this by adding …; to be extra safe, we will use …

Generation before the fix: [screenshot not recovered]

Generation after the fix: [screenshot not recovered]
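The failure mode described above can be sketched with a toy example. This is not llama.cpp's actual conversion code; all names here are illustrative. The idea is that if special tokens are re-encoded with BOS handling enabled, the tokenizer prepends <|begin_of_text|> to every one of them, which corrupts generation for models like Llama-3.1-8B-Instruct:

```python
# Toy stand-in for a tokenizer pipeline; names are hypothetical,
# not taken from llama.cpp or transformers.
BOS = "<|begin_of_text|>"
SPECIAL_TOKENS = ["<|eot_id|>", "<|start_header_id|>", "<|end_header_id|>"]

def encode(text, add_bos):
    """Toy encoder: returns a list of token strings, optionally BOS-prefixed."""
    tokens = [text]
    return ([BOS] + tokens) if add_bos else tokens

def convert_specials_buggy():
    # Bug: BOS is unconditionally prepended to each special token.
    return [encode(tok, add_bos=True) for tok in SPECIAL_TOKENS]

def convert_specials_fixed():
    # Fix: never add BOS when encoding a lone special token.
    return [encode(tok, add_bos=False) for tok in SPECIAL_TOKENS]

print(convert_specials_buggy()[0])  # ['<|begin_of_text|>', '<|eot_id|>']
print(convert_specials_fixed()[0])  # ['<|eot_id|>']
```

With the buggy path, every special token carries a spurious BOS prefix; the fixed path leaves special tokens untouched, which matches the before/after generations shown in the comment.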
Created a new PR with the fix: #10883
This reverts commit 382bc7f.
Adding Falcon3 model support